Industry 4.0 Concepts and Lean Methods Mitigating Traditional Losses in Engineer-to-Order Manufacturing with Subsequent Assembly On-Site: A Framework
Abstract Engineer-to-Order companies design and manufacture complex products based on specific customer requirements. Their project-driven processes and non-repetitive production cause various inefficiencies, which lead to productivity losses. Conventional approaches such as Lean Manufacturing and Lean Construction are limited in mitigating these losses because they are challenging to implement in the Engineer-to-Order environment. New concepts and technologies from Industry 4.0 have the potential to mitigate these losses by digitizing processes but remain under-researched in the Engineer-to-Order industry. This article classifies traditional losses in Engineer-to-Order manufacturing companies and identifies, through a literature review, several Lean and Industry 4.0 methods with the potential to mitigate these losses. The results are presented in a framework that can be used to develop a Lean and Industry 4.0 assessment tool, supporting companies in implementing these concepts to mitigate the presented loss categories. Further research should focus on validating the framework with empirical data.
How deep is deep enough? -- Quantifying class separability in the hidden layers of deep neural networks
Deep neural networks typically outperform more traditional machine learning models in their ability to classify complex data, and yet it is not clear how the individual hidden layers of a deep network contribute to the overall classification performance. We thus introduce a Generalized Discrimination Value (GDV) that measures, in a non-invasive manner, how well different data classes separate in each given network layer. The GDV can be used for the automatic tuning of hyper-parameters, such as the width profile and the total depth of a network. Moreover, the layer-dependent GDV(L) provides new insights into the data transformations that self-organize during training: in the case of multi-layer perceptrons trained with error backpropagation, we find that classification of highly complex data sets requires a temporary 'reduction' of class separability, marked by a characteristic 'energy barrier' in the initial part of the GDV(L) curve. Even more surprisingly, for a given data set, the GDV(L) runs along a fixed 'master curve', independently of the total number of network layers. Furthermore, applying the GDV to Deep Belief Networks reveals that unsupervised training with the Contrastive Divergence method can also systematically increase class separability over tens of layers, even though the system does not 'know' the desired class labels. These results indicate that the GDV may become a useful tool to open the black box of deep learning.
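The exact definition of the GDV is not given in the abstract; as a minimal sketch, assuming a separability score of the kind the abstract describes can be built by contrasting mean within-class and between-class distances of a layer's activations (the function name `gdv_sketch` and the specific formula are illustrative assumptions, not the paper's definition):

```python
import numpy as np

def gdv_sketch(features, labels):
    """Hypothetical class-separability score for one network layer.

    Assumption: this is NOT the paper's GDV formula. It merely
    contrasts mean within-class and between-class Euclidean
    distances on z-scored features; a larger value means the
    classes separate better in this layer.
    """
    X = np.asarray(features, dtype=float)
    y = np.asarray(labels)
    # z-score each feature dimension so scale does not dominate
    X = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-12)
    classes = np.unique(y)

    def mean_pairwise(A, B, same_class):
        # full pairwise Euclidean distance matrix between A and B
        d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=-1)
        if same_class:
            # use only distinct pairs (upper triangle, no diagonal)
            iu = np.triu_indices(len(A), k=1)
            return d[iu].mean()
        return d.mean()

    intra = np.mean([mean_pairwise(X[y == c], X[y == c], True)
                     for c in classes])
    inter = np.mean([mean_pairwise(X[y == a], X[y == b], False)
                     for i, a in enumerate(classes)
                     for b in classes[i + 1:]])
    return inter - intra
```

Applied to the activations of each layer in turn, such a score would trace out a curve over layer index L, in the spirit of the GDV(L) curves the abstract discusses.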
Shear Bond Strength of Nine Dual Cured Build-Up Materials and a Light-Curing Adhesive System on Dentin
The purpose of this study was to measure the shear bond strength of nine different commercially available dual cured core build-up materials on human dentin in conjunction with a light-curing adhesive system. The null hypothesis was that there is no statistically significant difference among the core build-up systems.
Effects of Different Levels of Abstraction Simulating Heat Sources in FEM Considering Drilling
Abstract This paper presents a comparison of three different methods of simulating heat sources in 3D FEM simulations of drilling, at various levels of abstraction. The investigated methods are modelled and evaluated with respect to calculation time and accuracy of the simulated temperature fields and phase transformations. The results show a significant variance in the maximum temperature and temperature distribution across the three heat sources, even though the same amount of energy is used in the simulation model. The most detailed heat source provides the most realistic temperature distribution, at the cost of the longest simulation time.